# 128k long-text processing
## Mistral Nemo Base 2407

License: Apache-2.0
Mistral-Nemo-Base-2407 is a 12-billion-parameter pretrained generative text model developed jointly by Mistral AI and NVIDIA. It supports a 128k-token context window and outperforms existing models of similar or smaller size.
Tags: Large Language Model, Transformers, Supports Multiple Languages
Publisher: mistralai
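Below is a minimal sketch of loading the model with the Hugging Face Transformers library and generating a continuation. It assumes the checkpoint is published under the `mistralai/Mistral-Nemo-Base-2407` ID and that enough GPU memory is available for a 12B-parameter model in half precision; verify the exact model ID and hardware requirements before use.

```python
# Minimal sketch: load Mistral-Nemo-Base-2407 with Hugging Face Transformers
# and continue a prompt. The model ID below is an assumption taken from the
# publisher and model name listed above.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mistral-Nemo-Base-2407"  # assumed Hugging Face model ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half precision to fit the 12B weights
    device_map="auto",           # spread layers across available devices
)

# A base (non-instruct) model simply continues the given text.
prompt = "Summary of the document:\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because this is a base model rather than an instruction-tuned one, prompts should be phrased as text to be continued, not as chat-style instructions.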